
    Initial Starting Point Analysis for K-Means Clustering: A Case Study

    Workload characterization is an important part of systems performance modeling. Clustering is a method used to find classes of jobs within workloads. K-Means is one of the most popular clustering algorithms. Initial starting point values are needed as input parameters when performing k-means clustering. This paper shows that the results of running the k-means algorithm on the same workload will vary depending on the values chosen as initial starting points. Fourteen methods of composing initial starting point values are compared in a case study. The results indicate that a synthetic method, scrambled midpoints, is an effective starting point method for k-means clustering.
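    The sensitivity to initial starting points that this abstract describes is easy to demonstrate. The sketch below (illustrative only, not the paper's code or its fourteen methods) runs plain Lloyd's-algorithm k-means twice on the same small synthetic dataset with two different initializations and reports the final within-cluster sum of squares for each.

```python
# Illustration: k-means results depend on the initial starting points.
# We run Lloyd's algorithm twice on the same data with different
# initial centers; the final inertia (within-cluster sum of squares)
# can differ between runs.
import numpy as np

def kmeans(points, centers, iters=50):
    """Plain Lloyd's algorithm; `centers` holds the initial starting points."""
    centers = centers.astype(float).copy()
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for k in range(len(centers)):
            if np.any(labels == k):
                centers[k] = points[labels == k].mean(axis=0)
    inertia = ((points - centers[labels]) ** 2).sum()
    return labels, centers, inertia

# Three well-separated synthetic "job classes".
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in ((0, 0), (4, 0), (2, 3))])

# Two initializations: the first three points vs. three random points.
_, _, inertia_a = kmeans(data, data[:3])
_, _, inertia_b = kmeans(data, data[rng.choice(len(data), 3, replace=False)])
print(inertia_a, inertia_b)  # the two inertias may differ for the same workload
```

Taking the first three data points (which all lie in one class) as starting values typically leaves k-means in a poor local optimum, which is exactly why synthetic initialization strategies such as the scrambled-midpoints method studied here matter.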

    A Case Study on Grid Performance Modeling

    The purpose of this case study is to develop a performance model for an enterprise grid for performance management and capacity planning. The target environment includes grid applications such as health-care and financial services where the data is located primarily within the resources of a worldwide corporation. The approach is to build a discrete event simulation model for a representative work-flow grid. Five work-flow classes, found using a customized k-means clustering algorithm, characterize the workload of the grid. Analyzing the gap between the simulation and measurement data validates the model. The case study demonstrates that the simulation model can be used to predict the grid system performance given a workload forecast. The model is also used to evaluate alternative scheduling strategies. The simulation model is flexible and easily incorporates several system details.
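    As a rough sense of what a discrete event simulation of such a grid involves, the toy sketch below (not the paper's model; class names, arrival rates, and service times are invented for illustration) drives a single first-come-first-served resource from an event list holding arrivals of a few workload classes.

```python
# Toy discrete-event sketch: one FCFS grid resource serving jobs from
# several hypothetical workload classes via a heapq-based event list.
import heapq
import random

random.seed(1)
CLASSES = {"short": 1.0, "medium": 3.0, "long": 8.0}  # hypothetical mean service times

events = []            # (time, kind, class) min-heap of pending events
t_end = 100.0          # simulated horizon
busy_until = 0.0       # time at which the single server frees up
completed = 0

# Pre-schedule exponential inter-arrival times for each class.
for cls in CLASSES:
    t = 0.0
    while t < t_end:
        t += random.expovariate(0.2)  # hypothetical arrival rate per class
        heapq.heappush(events, (t, "arrival", cls))

while events:
    clock, kind, cls = heapq.heappop(events)
    if clock > t_end:
        break
    # FCFS service: a job starts when the server is next free.
    start = max(clock, busy_until)
    busy_until = start + random.expovariate(1.0 / CLASSES[cls])
    completed += 1

print("jobs completed by t =", t_end, ":", completed)
```

A full model like the one described here would add multiple resources, the measured workload classes, and the alternative scheduling strategies under evaluation, but the event-list-plus-clock skeleton is the same.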

    Capacity Planning of a Commodity Cluster in an Academic Environment: A Case Study

    In this paper, the design of a simulation model for evaluating two alternative supercomputer configurations in an academic environment is presented. The workload is analyzed and modeled, and its effect on the relative performance of both systems is studied. The Integrated Capacity Planning Environment (ICPE) toolkit, developed for commodity cluster capacity planning, is successfully applied to the target environment. The ICPE is a tool for workload modeling, simulation modeling, and what-if analysis. A new characterization strategy is applied to the workload to more accurately model commodity cluster workloads. Through what-if analysis, the sensitivity of the baseline system performance to workload change, as well as the relative performance of the two proposed alternative systems, is compared and evaluated. This case study demonstrates the usefulness of the methodology and the applicability of the tools in gauging system capacity and making design decisions.

    IVOA Recommendation: Sky Event Reporting Metadata Version 2.0

    VOEvent defines the content and meaning of a standard information packet for representing, transmitting, publishing and archiving information about a transient celestial event, with the implication that timely follow-up is of interest. The objective is to motivate the observation of targets-of-opportunity, to drive robotic telescopes, to trigger archive searches, and to alert the community. VOEvent is focused on the reporting of photon events, but events mediated by disparate phenomena such as neutrinos, gravitational waves, and solar or atmospheric particle bursts may also be reported. Structured data is used, rather than natural language, so that automated systems can effectively interpret VOEvent packets. Each packet may contain zero or more of the "who, what, where, when & how" of a detected event, but in addition, may contain a hypothesis (a "why") regarding the nature of the underlying physical cause of the event. Citations to previous VOEvents may be used to place each event in its correct context. Proper curation is encouraged throughout each event's life cycle from discovery through successive follow-ups. VOEvent packets gain persistent identifiers and are typically stored in databases reached via registries. VOEvent packets may therefore reference other packets in various ways. Packets are encouraged to be small and to be processed quickly. This standard does not define a transport layer or the design of clients, repositories, publishers or brokers; nor does it cover policy issues such as who can publish, who can build a registry of events, who can subscribe to a particular registry, or the intellectual property issues involved.
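    The "who, what, where, when & how" structure described above can be sketched as a skeleton packet. The snippet below is a schematic illustration only, built with the standard library; the identifier is hypothetical, and the normative element layout, namespaces, and attributes are defined by the IVOA VOEvent 2.0 schema, not by this sketch.

```python
# Schematic VOEvent-like packet skeleton (illustrative, not schema-valid).
import xml.etree.ElementTree as ET

pkt = ET.Element("VOEvent", {"version": "2.0", "role": "observation",
                             "ivorn": "ivo://example.org/events#1"})  # hypothetical identifier
ET.SubElement(pkt, "Who")        # publisher and curation metadata
ET.SubElement(pkt, "What")       # event parameters: structured data, not prose
ET.SubElement(pkt, "WhereWhen")  # sky position and timestamp
ET.SubElement(pkt, "How")        # instrument / detection description
ET.SubElement(pkt, "Why")        # optional hypothesis about the physical cause
ET.SubElement(pkt, "Citations")  # references to earlier packets, for context
xml_text = ET.tostring(pkt, encoding="unicode")
print(xml_text)
```

Because packets carry structured elements rather than free text, an automated broker can dispatch on, say, the presence and contents of `WhereWhen` without any natural-language processing.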

    Non-invasive stimulation of the social brain: the methodological challenges

    Use of non-invasive brain stimulation methods (NIBS) has become a common approach to study social processing in addition to behavioural, imaging and lesion studies. However, research using NIBS to investigate social processing faces challenges. Overcoming these challenges is important to allow valid and reliable interpretation of findings in neurotypical cohorts, but also to allow us to tailor NIBS protocols to atypical groups with social difficulties. In this review, we consider the utility of brain stimulation as a technique to study and modulate social processing. We also discuss challenges that face researchers using NIBS to study social processing in neurotypical adults, with a view to highlighting potential solutions. Finally, we discuss additional challenges that face researchers using NIBS to study and modulate social processing in atypical groups. These are important to consider given that NIBS protocols are rarely tailored to atypical groups before use. Instead, many rely on protocols designed for neurotypical adults despite differences in brain function that are likely to impact response to NIBS.

    Neuroimaging the consciousness of self: Review, and conceptual-methodological framework

    We review neuroimaging research investigating self-referential processing (SRP), that is, how we respond to stimuli that reference ourselves, prefaced by a lexical-thematic analysis of words indicative of “self-feelings”. We consider SRP as occurring verbally (V-SRP) and non-verbally (NV-SRP), both in the controlled, “top-down” form of introspective and interoceptive tasks, respectively, as well as in the “bottom-up” spontaneous or automatic form of “mind wandering” and “body wandering” that occurs during resting state. Our review leads us to outline a conceptual and methodological framework for future SRP research, which we briefly apply toward understanding certain psychological and neurological disorders symptomatically associated with abnormal SRP. Our discussion is partly guided by William James’ original writings on the consciousness of self.

    Effects of fluoxetine on functional outcomes after acute stroke (FOCUS): a pragmatic, double-blind, randomised, controlled trial

    Background Results of small trials indicate that fluoxetine might improve functional outcomes after stroke. The FOCUS trial aimed to provide a precise estimate of these effects. Methods FOCUS was a pragmatic, multicentre, parallel group, double-blind, randomised, placebo-controlled trial done at 103 hospitals in the UK. Patients were eligible if they were aged 18 years or older, had a clinical stroke diagnosis, were enrolled and randomly assigned between 2 days and 15 days after onset, and had focal neurological deficits. Patients were randomly allocated fluoxetine 20 mg or matching placebo orally once daily for 6 months via a web-based system by use of a minimisation algorithm. The primary outcome was functional status, measured with the modified Rankin Scale (mRS), at 6 months. Patients, carers, health-care staff, and the trial team were masked to treatment allocation. Functional status was assessed at 6 months and 12 months after randomisation. Patients were analysed according to their treatment allocation. This trial is registered with the ISRCTN registry, number ISRCTN83290762. Findings Between Sept 10, 2012, and March 31, 2017, 3127 patients were recruited. 1564 patients were allocated fluoxetine and 1563 allocated placebo. mRS data at 6 months were available for 1553 (99·3%) patients in each treatment group. The distribution across mRS categories at 6 months was similar in the fluoxetine and placebo groups (common odds ratio adjusted for minimisation variables 0·951 [95% CI 0·839–1·079]; p=0·439). Patients allocated fluoxetine were less likely than those allocated placebo to develop new depression by 6 months (210 [13·43%] patients vs 269 [17·21%]; difference 3·78% [95% CI 1·26–6·30]; p=0·0033), but they had more bone fractures (45 [2·88%] vs 23 [1·47%]; difference 1·41% [95% CI 0·38–2·43]; p=0·0070). There were no significant differences in any other event at 6 or 12 months. 
Interpretation Fluoxetine 20 mg given daily for 6 months after acute stroke does not seem to improve functional outcomes. Although the treatment reduced the occurrence of depression, it increased the frequency of bone fractures. These results do not support the routine use of fluoxetine either for the prevention of post-stroke depression or to promote recovery of function. Funding UK Stroke Association and NIHR Health Technology Assessment Programme.
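    The depression result in the Findings can be checked by hand. The sketch below recomputes the absolute risk difference and its 95% confidence interval from the reported counts (210/1564 fluoxetine vs 269/1563 placebo) using a standard normal approximation; it is a sanity check on the published figures, not the trial's analysis code.

```python
# Recompute the new-depression risk difference and 95% CI from the
# counts reported in the Findings, via a normal approximation.
from math import sqrt

a, n1 = 210, 1564   # fluoxetine: new depression / allocated
b, n2 = 269, 1563   # placebo: new depression / allocated
p1, p2 = a / n1, b / n2
diff = p2 - p1
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"difference {diff:.2%}, 95% CI {lo:.2%} to {hi:.2%}")
# prints: difference 3.78%, 95% CI 1.26% to 6.30%
```

This matches the abstract's reported difference of 3.78% (95% CI 1.26 to 6.30), suggesting the interval was computed with an unadjusted normal approximation or something very close to it.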

    Value through innovation in long-term service delivery: facility management in an Australian PPP

    Purpose: Public-private partnerships (PPPs) and other innovative procurement mechanisms are frequently used to deliver both an asset and a public service over a protracted period. The value streams to the parties involved can be complex, but generally arise from the satisfactory provision of infrastructure that is fit for purpose throughout its life. This research aims to investigate the effectiveness of the facility management (FM) function in delivering long-term value to both the client and consortium. Design/methodology/approach: This paper describes a case study of a PPP in Australia that delivered social infrastructure in multiple locations to a state government. Drawing upon multiple perspectives from within the consortium, it utilises inductive principles to identify the influences on value generation through innovation by the FM function. Findings: The ability of an Australian FM contractor to provide value within a PPP context has been shown to reflect some of the attributes described in the literature. However, the extent of innovation, especially in the design and construction phases, has been limited by organisational history and capability, and by relational and contextual issues. Originality/value: This research highlights a flaw in the rhetoric relating to PPP delivery, namely the disconnection between the asset delivery and service delivery phases, which stifles the consortium's capacity to innovate and maximise value. It reveals a set of influences that both resonate with the literature and plausibly explain the suboptimal performance of the FM function within an Australian PPP. By using highly iterative analysis leading to within-case generalisability, it provides a robust basis for wider investigation of the problem.